An implementable lossy version of the Lempel-Ziv algorithm - Part I: Optimality for memoryless sources
Author
Abstract
A new lossy variant of the Fixed-Database Lempel–Ziv coding algorithm for encoding at a fixed distortion level is proposed, and its asymptotic optimality and universality for memoryless sources (with respect to bounded single-letter distortion measures) is demonstrated: As the database size m increases to infinity, the expected compression ratio approaches the rate-distortion function. The complexity and redundancy characteristics of the algorithm are comparable to those of its lossless counterpart. A heuristic argument suggests that the redundancy is of order (log log m)/(log m), and simulation results are presented that agree well with this rate. We show that there is a tradeoff between compression performance and encoding complexity, and we discuss how the relevant parameters can be chosen to balance this tradeoff in practice. We also discuss the performance of the algorithm when applied to sources with memory, and extensions to the cases of unbounded distortion measures and infinite reproduction alphabets.
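The abstract describes the scheme only at a high level. As a rough illustration of the general fixed-database idea (a minimal sketch under simplifying assumptions, not the paper's exact algorithm), the following Python code greedily parses a binary source against a fixed database: at each step it finds the longest prefix of the remaining source that matches some database substring within per-letter Hamming distortion D, and encodes that phrase as a (position, length) pointer costing roughly log2(m) bits. The literal fallback, the bit accounting, and all parameter values are illustrative assumptions.

```python
import math
import random

def longest_match(source, start, database, D):
    """Longest prefix of source[start:] matching a contiguous substring of
    `database` with at most D mismatches per letter (Hamming distortion).
    Brute force, O(m * phrase length); a sketch, not an efficient index."""
    best_pos, best_len = 0, 0
    n, m = len(source), len(database)
    for pos in range(m):
        length = mismatches = 0
        while start + length < n and pos + length < m:
            mismatches += source[start + length] != database[pos + length]
            if mismatches > D * (length + 1):   # distortion budget exceeded
                break
            length += 1
        if length > best_len:
            best_pos, best_len = pos, length
    return best_pos, best_len

def encode(source, database, D):
    """Greedy parsing into pointer phrases; idealised cost of roughly
    log2(m) + 2*log2(length) bits per phrase, plus 1-bit phrase-type flags."""
    m, i, phrases, bits = len(database), 0, [], 0.0
    while i < len(source):
        pos, length = longest_match(source, i, database, D)
        if length == 0:                          # no admissible match: send the letter itself
            phrases.append(("lit", source[i]))
            bits += 1 + 1                        # flag + raw binary literal
            i += 1
        else:
            phrases.append(("ptr", pos, length))
            bits += 1 + math.log2(m) + 2 * math.log2(length + 1)
            i += length
    return phrases, bits

def decode(phrases, database):
    """Reconstruct the (distorted) source by copying phrases out of the database."""
    out = []
    for ph in phrases:
        if ph[0] == "lit":
            out.append(ph[1])
        else:
            _, pos, length = ph
            out.extend(database[pos:pos + length])
    return out

# Toy run: a Bernoulli(1/2) binary source against a random binary database.
random.seed(0)
m, n, D = 4096, 2000, 0.1
database = [random.randint(0, 1) for _ in range(m)]
source = [random.randint(0, 1) for _ in range(n)]
phrases, bits = encode(source, database, D)
recon = decode(phrases, database)
distortion = sum(a != b for a, b in zip(source, recon)) / n
print(f"rate ~ {bits / n:.3f} bits/symbol, distortion ~ {distortion:.3f}")
```

By construction, each pointer phrase keeps its empirical distortion at or below D and literals are exact, so the reconstruction meets the distortion constraint; the rate of this toy version is far from the rate-distortion function, which the paper's scheme approaches only as m grows and with databases chosen appropriately (details in the paper).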
Similar articles
Complexity-compression tradeoffs in lossy compression via efficient random codebooks and databases
The compression-complexity trade-off of lossy compression algorithms that are based on a random codebook or a random database is examined. Motivated, in part, by recent results of Gupta-Verdú-Weissman (GVW) and their underlying connections with the pattern-matching scheme of Kontoyiannis' lossy Lempel-Ziv algorithm, we introduce a non-universal version of the lossy Lempel-Ziv method (termed LLZ)...
Optimal Lempel-Ziv based lossy compression for memoryless data: how to make the right mistakes
Compression refers to encoding data using bits, so that the representation uses as few bits as possible. Compression can be lossless (i.e., the encoded data can be recovered exactly from its representation) or lossy, where the data is compressed more than in the lossless case but can still be recovered to within a prespecified distortion metric. In this paper, we prove the optimality of Codelet Parsing...
Lossy Compression in Near-Linear Time via Efficient Random Codebooks and Databases
The compression-complexity trade-off of lossy compression algorithms that are based on a random codebook or a random database is examined. Motivated, in part, by recent results of Gupta-Verdú-Weissman (GVW) and their underlying connections with the pattern-matching scheme of Kontoyiannis' lossy Lempel-Ziv algorithm, we introduce a non-universal version of the lossy Lempel-Ziv method (termed LLZ)...
A Lossy Data Compression Based on String Matching: Preliminary Analysis and Suboptimal Algorithms
A practical suboptimal algorithm (source coding) for lossy (non-faithful) data compression is discussed. This scheme is based on approximate string matching, and it naturally extends the lossless (faithful) Lempel-Ziv data compression scheme. The construction of the algorithm is based on a careful probabilistic analysis of an approximate string matching problem that is of its own interest. This ...
Journal: IEEE Trans. Information Theory
Volume: 45, Issue: -
Pages: -
Publication date: 1999